
MkLlm


Node for LLM-based text generation.

Example: Regular

Jinja

{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}

Python

MkLlm('Write a poem about MkDocs')

Ode to MkDocs

In realms where knowledge flows like streams,
A treasure trove of structured dreams,
MkDocs rises, sleek and bright,
A beacon guiding through the night.

With Markdown's grace, it takes its form,
A canvas where ideas can transform,
Simple syntax, yet so profound,
In organized lines, wisdom is found.

From project docs to wikis wide,
It supports them all, a faithful guide,
A static site, fast and free,
Unlocking the power of community.

Themes to wear, like cloaks of lore,
Inviting all to learn and explore,
Responsive and clean, it stands the test,
A reliable friend in the knowledge quest.

Version control, the artist's touch,
Keeping changes without a fuss,
Collaborative spirits, unite in code,
As Markdown dances, knowledge flowed.

So here's to you, O MkDocs dear,
For simplifying paths, both far and near,
In every line, a story's told,
In your embrace, our wisdom unfolds.


Bases: MkText

text property

text: str

__init__

__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)

Parameters:

user_prompt (str, required): Main prompt for the LLM.
system_prompt (str | None, default None): System prompt to set LLM behavior.
model (str, default "openai:gpt-4o-mini"): LLM model identifier to use.
context (str | None, default None): Main context string.
extra_files (Sequence[str | PathLike[str]] | None, default None): Additional context files or strings.
kwargs (Any): Keyword arguments passed to parent.
Inherits: MkText (mknodes.basenodes.mktext) — Class for any Markup text.

graph TD
  MkLlm["mkllm.MkLlm"]
  MkText["mktext.MkText"]
  MkNode["mknode.MkNode"]
  Node["node.Node"]
  object["builtins.object"]
  MkText --> MkLlm
  MkNode --> MkText
  Node --> MkNode
  object --> Node
mknodes/templatenodes/mkllm/metadata.toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )
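The extra_files resolution performed by _process_extra_files can be sketched as a standalone function. This version substitutes pathlib.Path for UPath so it runs without the universal_pathlib dependency (which means remote URLs are not supported here), but mirrors the same behavior: files are read, directories are walked recursively, anything else is kept as a literal string, and a failed read is logged and re-raised as ValueError:

```python
import logging
from pathlib import Path

logger = logging.getLogger(__name__)


def process_extra_files(extra_files: list[str]) -> list[str]:
    """Resolve extra context items into plain strings.

    Sketch of the node's _process_extra_files, with pathlib.Path
    standing in for UPath.
    """
    context_items: list[str] = []
    for item in extra_files:
        try:
            path = Path(item)
            if path.is_file():
                context_items.append(path.read_text())
            elif path.is_dir():
                # Read every file under the directory, recursively.
                context_items.extend(
                    f.read_text() for f in sorted(path.rglob("*")) if f.is_file()
                )
            else:
                # Not a path on disk: treat the item as inline context.
                context_items.append(str(item))
        except OSError as exc:
            err_msg = f"Failed to read context file: {item}"
            logger.warning(err_msg)
            raise ValueError(err_msg) from exc
    return context_items
```

Passing a mix of a file path and an inline string, e.g. process_extra_files(["notes.txt", "inline context"]), returns the file's contents followed by the literal string.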